
    Extrapolation-Based Super-Convergent Implicit-Explicit Peer Methods with A-stable Implicit Part

    In this paper, we extend the implicit-explicit (IMEX) methods of Peer type recently developed in [Lang, Hundsdorfer, J. Comp. Phys., 337:203--215, 2017] to a broader class of two-step methods that allow the construction of super-convergent IMEX-Peer methods with A-stable implicit part. IMEX schemes combine the necessary stability of implicit methods with the low computational cost of explicit methods to efficiently solve systems of ordinary differential equations whose source terms contain both stiff and non-stiff parts. To construct super-convergent IMEX-Peer methods with favourable stability properties, we derive necessary and sufficient conditions on the coefficient matrices and apply an extrapolation approach based on already computed stage values. Optimised super-convergent IMEX-Peer methods of order s+1 for s=2,3,4 stages are given as the result of a search algorithm carefully designed to balance the size of the stability regions and the extrapolation errors. Numerical experiments and a comparison to other IMEX-Peer methods are included.
    Comment: 22 pages, 4 figures. arXiv admin note: text overlap with arXiv:1610.0051
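    The splitting idea behind IMEX schemes can be illustrated with first-order IMEX Euler on a toy stiff problem (a hypothetical example for illustration, not the paper's Peer methods): the stiff linear term is treated implicitly, the non-stiff term explicitly.

    ```python
    import math

    # Toy problem:  y' = -1000*(y - cos t)  (stiff, implicit)
    #                    - sin t            (non-stiff, explicit)
    # With y(0) = 1 the exact solution is y(t) = cos(t).

    def imex_euler(y0, t_end, dt, a=1000.0):
        """First-order IMEX Euler: implicit in the stiff linear term,
        explicit in the non-stiff forcing term."""
        y, t = y0, 0.0
        for _ in range(round(t_end / dt)):
            t_new = t + dt
            # Solve y_new = y + dt*(-a*(y_new - cos t_new)) + dt*(-sin t)
            # for y_new; the implicit part is linear, so this is one division.
            y = (y + dt * a * math.cos(t_new) - dt * math.sin(t)) / (1.0 + dt * a)
            t = t_new
        return y

    y = imex_euler(1.0, 1.0, 0.01)
    err = abs(y - math.cos(1.0))
    ```

    Note that dt = 0.01 lies far outside the stability limit of fully explicit Euler for this problem (dt < 0.002), yet the IMEX step remains stable because the stiff term is handled implicitly.
    
    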

    Denominator Bounds and Polynomial Solutions for Systems of q-Recurrences over K(t) for Constant K

    We consider systems A_\ell(t) y(q^\ell t) + ... + A_0(t) y(t) = b(t) of higher order q-recurrence equations with rational coefficients. We extend a method for finding a bound on the maximal power of t in the denominator of arbitrary rational solutions y(t), as well as a method for bounding the degree of polynomial solutions, from the scalar case to the systems case. The approach is direct and does not rely on uncoupling or reduction to a first order system. Unlike in the scalar case, this usually requires an initial transformation of the system.
    Comment: 8 pages
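    The degree-bounding idea in the scalar case can be seen on a small hypothetical example with q = 2: for y(qt) - 4 y(t) = -3, a monomial c*t^d contributes (q^d - 4)*c*t^d on the left, so the leading term can only cancel when q^d = 4, giving the candidate degree d = 2. The sketch below (my own toy instance, not from the paper) verifies a polynomial solution against this equation.

    ```python
    # Toy scalar q-recurrence:  y(q t) - 4 y(t) = -3,  with q = 2.
    # Polynomials are lists of coefficients, lowest degree first.

    q = 2

    def q_shift(coeffs, q):
        """Coefficients of y(q*t) given the coefficients of y(t):
        the t^i coefficient picks up a factor q^i."""
        return [c * q**i for i, c in enumerate(coeffs)]

    def residual(coeffs, q):
        """Coefficients of y(q t) - 4 y(t) + 3; all zero iff y solves
        the equation y(q t) - 4 y(t) = -3."""
        shifted = q_shift(coeffs, q)
        res = [s - 4 * c for s, c in zip(shifted, coeffs)]
        res[0] += 3
        return res

    # y(t) = t^2 + 1 matches the degree bound d = 2 and solves the equation:
    # y(2t) - 4 y(t) = (4t^2 + 1) - (4t^2 + 4) = -3.
    res_good = residual([1, 0, 1], q)
    res_bad = residual([1, 1, 1], q)  # y(t) = t^2 + t + 1 does not solve it
    ```
    
    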

    Pre main sequence: Accretion & Outflows

    Low-mass pre-main sequence (PMS) stars are strong X-ray sources, because they possess hot coronae like their older main-sequence counterparts. Unique to young stars, however, are X-rays from accretion and outflows, and both processes are of pivotal importance for star and planet formation. We describe how X-ray data provide important insight into the physics of accretion and outflows. First, mass accreted from a circumstellar disk onto the stellar surface reaches velocities up to a few hundred km/s, fast enough to generate soft X-rays in the post-shock region of the accretion shock. X-ray observations together with laboratory experiments and numerical simulations show that the accretion geometry is complex in young stars. Specifically, the center of the accretion column is likely surrounded by material shielding the inner flow from view but itself also hot enough to emit X-rays. Second, X-rays are observed at two locations in protostellar jets: an inner stationary emission component probably related to outflow collimation, and outer components, which evolve within years and are likely related to working surfaces where the shock travels through the jet. Jet-powered X-rays appear to trace the fastest jet component and provide novel information on jet launching in young stars. We conclude that X-ray data will continue to be highly important for understanding star and planet formation, because they directly probe the origin of many emission features studied in other wavelength regimes. In addition, future X-ray missions will improve sensitivity and spectral resolution to probe key model parameters (e.g. velocities) in large samples of PMS stars.
    Comment: Invited chapter for the "Handbook of X-ray and Gamma-ray Astrophysics" (Eds. C. Bambi and A. Santangelo, Springer Nature, 2022), accepted (34 pages, 11 figures)
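    Why a few hundred km/s suffices for soft X-rays can be checked with the standard strong-shock jump condition T = (3/16) * mu * m_H * v^2 / k_B (a textbook estimate, not a calculation from the chapter; the value mu = 0.62 for ionized solar-composition gas is an illustrative assumption).

    ```python
    # Back-of-envelope post-shock temperature of an accretion shock.

    k_B = 1.380649e-23   # Boltzmann constant, J/K
    m_H = 1.6735575e-27  # hydrogen atom mass, kg
    mu = 0.62            # mean molecular weight, fully ionized gas (assumed)

    def post_shock_temperature(v_kms):
        """Strong-shock temperature T = (3/16) mu m_H v^2 / k_B for an
        infall velocity given in km/s."""
        v = v_kms * 1e3  # convert to m/s
        return 3.0 / 16.0 * mu * m_H * v**2 / k_B

    # A free-fall velocity of ~500 km/s, as quoted for PMS accretion streams,
    # heats the post-shock gas to a few million kelvin, i.e. soft X-rays.
    T = post_shock_temperature(500.0)
    ```
    
    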

    Evaluating the Effectiveness of Natural Language Inference for Hate Speech Detection in Languages with Limited Labeled Data

    Most research on hate speech detection has focused on English, where a sizeable amount of labeled training data is available. However, to expand hate speech detection into more languages, approaches that require minimal training data are needed. In this paper, we test whether natural language inference (NLI) models, which perform well in zero- and few-shot settings, can benefit hate speech detection performance in scenarios where only a limited amount of labeled data is available in the target language. Our evaluation on five languages demonstrates large performance improvements of NLI fine-tuning over direct fine-tuning in the target language. However, the effectiveness of previous work that proposed intermediate fine-tuning on English data is hard to match. Only in settings where the English training data does not match the test domain can our customised NLI formulation outperform intermediate fine-tuning on English. Based on our extensive experiments, we propose a set of recommendations for hate speech detection in languages where minimal labeled training data is available.
    Comment: 15 pages, 7 figures, Accepted at the 7th Workshop on Online Abuse and Harms (WOAH), ACL 202
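    The general recipe for casting a classification task as NLI is to use the input text as the premise and turn each class label into a hypothesis sentence, then score entailment for each pair. The sketch below shows only this data reformulation step; the hypothesis templates are my own illustrative wording, not the paper's formulation, and the entailment scoring model is omitted.

    ```python
    # Hypothetical premise/hypothesis construction for NLI-style
    # hate speech detection (templates are assumptions, not the paper's).

    LABEL_HYPOTHESES = {
        "hateful": "This text contains hate speech.",
        "not_hateful": "This text does not contain hate speech.",
    }

    def to_nli_pairs(text):
        """Turn one input text into (premise, hypothesis, label) tuples;
        an NLI model would then pick the label whose hypothesis is
        most strongly entailed by the premise."""
        return [(text, hyp, label) for label, hyp in LABEL_HYPOTHESES.items()]

    pairs = to_nli_pairs("example input sentence")
    ```
    
    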